# Weakly Supervised Contrastive Learning
## Multilingual E5 Small 4096
A Local-Sparse-Global (LSG) version of intfloat/multilingual-e5-small: a multilingual text embedding model extended to handle approximately 4k tokens.
Text Embedding · Transformers · Supports Multiple Languages
efederici · 16 · 0
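Embedding models like the ones in this list are typically used by encoding texts into vectors and comparing them with cosine similarity; E5 checkpoints additionally expect a "query: " or "passage: " prefix on the input text. A minimal sketch of that comparison step, with toy random vectors standing in for real model outputs (the 384-dimension size matches e5-small; everything else is illustrative):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# E5 checkpoints expect inputs prefixed with "query: " / "passage: ".
query = "query: how many tokens can the 4096-context variant handle?"
passage = "passage: the LSG variant extends the context to roughly 4k tokens."

# Toy vectors standing in for real model outputs (e5-small emits 384 dims).
rng = np.random.default_rng(0)
q_emb = rng.normal(size=384)
p_emb = q_emb + 0.1 * rng.normal(size=384)  # a passage close to the query
d_emb = rng.normal(size=384)                # an unrelated passage

sim_relevant = cosine_similarity(q_emb, p_emb)
sim_random = cosine_similarity(q_emb, d_emb)
```

In real use the vectors would come from encoding the prefixed strings with the model; the ranking step (higher cosine similarity means more relevant) is the same.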
## Multilingual E5 Base Xnli Anli
License: MIT
A multilingual text embedding model (intfloat/multilingual-e5-base) fine-tuned on the XNLI and ANLI datasets, suitable for zero-shot classification tasks.
Text Classification · Transformers · Supports Multiple Languages
mjwong · 73 · 0
## Multilingual E5 Large Xnli Anli
License: MIT
A version of multilingual-e5-large fine-tuned on the XNLI and ANLI datasets, supporting multilingual zero-shot classification tasks.
Text Classification · Transformers · Supports Multiple Languages
mjwong · 20 · 1
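NLI-fine-tuned models like those above perform zero-shot classification by turning each candidate label into an entailment hypothesis (e.g. "This example is about {label}.") and choosing the label whose hypothesis the model scores highest. A minimal sketch of that mechanism, with a toy keyword-overlap scorer standing in for the real entailment model:

```python
import math

def zero_shot_classify(text, labels, entail_score):
    """NLI-style zero-shot classification: each candidate label becomes an
    entailment hypothesis, scores are softmax-normalized, and the
    highest-scoring label wins."""
    scores = [entail_score(text, f"This example is about {label}.") for label in labels]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    probs = {label: e / total for label, e in zip(labels, exps)}
    return max(probs, key=probs.get), probs

def toy_entail_score(premise, hypothesis):
    # Keyword-overlap stand-in for a real NLI entailment score.
    premise_words = set(premise.lower().split())
    return sum(1.0 for w in hypothesis.lower().rstrip(".").split() if w in premise_words)

best, probs = zero_shot_classify(
    "In sports news, the match went to extra time before a late winner.",
    ["sports", "politics", "cooking"],
    toy_entail_score,
)
```

With the real model, `entail_score` would be the entailment logit from the fine-tuned NLI head; the hypothesis template and the softmax-over-labels step are the standard recipe.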
## Multilingual E5 Large Xnli
License: MIT
A multilingual text classification model based on multilingual-e5-large, fine-tuned on the XNLI dataset and supporting zero-shot classification in 15 languages.
Large Language Model · Transformers · Supports Multiple Languages
mjwong · 21 · 6
## E5 Base Multilingual 4096
E5-base-multilingual-4096 is a Local-Sparse-Global (LSG) version of intfloat/multilingual-e5-base: a multilingual text embedding model that can process up to 4096 tokens.
Text Embedding · Transformers · Supports Multiple Languages
efederici · 340 · 16
## E5 Base Unsupervised
License: MIT
A text embedding model trained with contrastive pre-training, suitable for sentence similarity and related tasks.
Text Embedding · English
intfloat · 940 · 1
## E5 Small Unsupervised
License: MIT
The unsupervised version of E5-small: it produces text embeddings via weakly supervised contrastive pre-training and is suitable for tasks such as text similarity computation.
Text Embedding · English
intfloat · 2,093 · 0
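The weakly supervised contrastive pre-training behind the E5 family typically optimizes an in-batch InfoNCE objective: each query embedding is pulled toward its paired passage and pushed away from every other passage in the batch. A minimal NumPy sketch of that loss (batch size, dimensionality, and temperature are illustrative, not the E5 training configuration):

```python
import numpy as np

def info_nce_loss(queries, passages, temperature=0.05):
    """In-batch InfoNCE: row i of `queries` pairs with row i of `passages`;
    all other rows in the batch serve as negatives."""
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    p = passages / np.linalg.norm(passages, axis=1, keepdims=True)
    logits = q @ p.T / temperature               # (batch, batch) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.diagonal(log_probs).mean()) # NLL of the true pairings

rng = np.random.default_rng(0)
q = rng.normal(size=(8, 32))
p_aligned = q + 0.05 * rng.normal(size=(8, 32))  # positives near their queries
p_random = rng.normal(size=(8, 32))              # unrelated "positives"

loss_aligned = info_nce_loss(q, p_aligned)
loss_random = info_nce_loss(q, p_random)
```

Aligned query–passage pairs drive the loss toward zero, while mismatched pairs leave it near the uniform baseline, which is the gradient signal that teaches the encoder useful embeddings.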